Hierarchical Regularization Cascade
Authors
Abstract
We present a hierarchical approach to information sharing among different classification tasks, in multitask and multi-class settings. We propose a top-down iterative method, which begins by posing an optimization problem with an incentive for large-scale sharing among all classes. This incentive to share is gradually decreased until there is no sharing and all tasks are considered separately. The method therefore exploits different levels of sharing within a given group of related tasks, without having to make hard decisions about the grouping of tasks. In order to deal with large-scale problems, with many tasks and many classes, we extend our batch approach to the online setting and provide a regret analysis of the algorithm. Based on the structure of shared information discovered in the joint learning setting, we propose two different knowledge-transfer methods for learning novel tasks. The methods are designed to work in these very challenging large-scale settings. We tested our methods extensively on synthetic and real datasets, showing significant improvement over baseline and state-of-the-art methods.
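To make the cascade idea above more concrete, the following is a minimal sketch under our own assumptions, not the authors' published algorithm: each level solves a multi-task problem whose regularizer mixes a shared, all-task group-sparsity term with a per-task sparsity term; the weight on the shared term starts at its maximum and decays to zero across levels, and the final model is the sum of the per-level solutions. The function names (prox_l1, prox_l21, fit_level, cascade_fit) and the particular penalties and schedule are illustrative choices.

import numpy as np

def prox_l1(W, t):
    # Soft-thresholding: proximal operator of t * ||W||_1 (per-task sparsity).
    return np.sign(W) * np.maximum(np.abs(W) - t, 0.0)

def prox_l21(W, t):
    # Row-wise shrinkage: proximal operator of t * sum_j ||W[j, :]||_2;
    # a feature row is kept or dropped jointly for all tasks, i.e. shared.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W * np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)

def fit_level(X, R, lam, share, n_iter=200):
    # One cascade level: proximal gradient descent on
    #   0.5 * ||R - X W||_F^2 + lam*share*||W||_{2,1} + lam*(1-share)*||W||_1
    W = np.zeros((X.shape[1], R.shape[1]))
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 + 1e-12)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        W = W - step * (X.T @ (X @ W - R))             # gradient step on the loss
        W = prox_l1(W, step * lam * (1.0 - share))     # per-task sparsity
        W = prox_l21(W, step * lam * share)            # shared (all-task) sparsity
    return W

def cascade_fit(X, Y, lam=0.1, levels=5):
    # Top-down cascade: the sharing weight decays from 1 (full sharing among
    # all tasks) to 0 (no sharing); each level fits the residual left by the
    # previous levels, and the final model is the sum of the level solutions.
    W_total = np.zeros((X.shape[1], Y.shape[1]))
    for k in range(levels):
        share = 1.0 - k / max(levels - 1, 1)
        R = Y - X @ W_total
        W_total += fit_level(X, R, lam, share)
    return W_total

Under this reading, the earliest levels capture structure common to all tasks while later levels add increasingly task-specific corrections; reusing the shared levels when fitting a new task is one natural route to the knowledge transfer mentioned in the abstract.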
Similar articles
Hierarchical Regularization Cascade for Joint Learning
for some constant M > 1. This assumption is justified if the accumulated cost obtained by the online algorithm is larger than the cost obtained by the optimal batch algorithm for most of the training examples. We argue that if this assumption is violated then the online algorithm is revealed to be a very good performer, and it therefore makes little sense to judge its performance by the deviati...
Constructive Neural Networks with Regularization
In this paper we present a regularization approach to the training of all the network weights in cascade-correlation type constructive neural networks. In particular, the case of regularizing the output neuron of the network is presented. In this case, the output weights are trained by employing a regularized objective function containing a penalty term which is proportional to the weight values of...
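For concreteness, a penalty of this kind is commonly written as a weight-decay term added to the training error; the exact form used in the cited paper is not reproduced here, but a standard choice is

$$ J(\mathbf{w}) = E(\mathbf{w}) + \lambda \sum_i w_i^2 , $$

where $E(\mathbf{w})$ is the network's error on the training data, the sum runs over the (output) weights, and $\lambda$ controls the strength of the regularization.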
Hierarchical Regularization Cascade for Joint Learning
As the sheer volume of available benchmark datasets increases, the problem of joint learning of classifiers and knowledge transfer between classifiers becomes more and more relevant. We present a hierarchical approach which exploits information sharing among different classification tasks, in multitask and multi-class settings. It engages a top-down iterative method, which begins by posing an ...
Hierarchical Feature Selection with Recursive Regularization
In the big data era, the sizes of datasets have increased dramatically in terms of the number of samples, features, and classes. In particular, there usually exists a hierarchical structure among the classes. This kind of task is called hierarchical classification. Various algorithms have been developed to select informative features for flat classification. However, these algorithms ignore the...
Regularization Framework for Large Scale Hierarchical Classification
In this paper, we propose a hierarchical regularization framework for large-scale hierarchical classification. In our framework, we use the regularization structure to share information across the hierarchy and to enforce similarity between class parameters that are located near each other in the hierarchy. To address the computational issues that arise, we propose a parallel-iterative optimization scheme ...
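One standard way to write such a penalty, shown here only as an illustration and not necessarily the exact form used in this paper, is

$$ \Omega(W) = \frac{\lambda}{2} \sum_{n} \| w_n - w_{\pi(n)} \|_2^2 , $$

where $w_n$ is the parameter vector of class (or node) $n$, $\pi(n)$ is its parent in the hierarchy, and $\lambda$ controls how strongly children are pulled toward their parents; this term is minimized jointly with the sum of the per-class classification losses.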
Journal title:
Volume / Issue:
Pages: -
Publication date: 2012